An Exploratory Analysis of the Neural Correlates of Human-Robot Interactions With Functional Near Infrared Spectroscopy
Functional near infrared spectroscopy (fNIRS) has been gaining interest as a practical mobile functional brain imaging technology for understanding the neural correlates of social cognition and emotional processing in the human prefrontal cortex (PFC). Given the cognitive complexity of human-robot interactions, the aim of this study was to explore the neural correlates of emotional processing of congruent and incongruent pairs of human and robot audio-visual stimuli in the human PFC with fNIRS. Hemodynamic responses from the PFC of 29 subjects were recorded with fNIRS during an experimental paradigm that consisted of auditory and visual presentation of human and robot stimuli. Distinct neural responses to human and robot stimuli were detected in the dorsolateral prefrontal cortex (DLPFC) and orbitofrontal cortex (OFC). Presentation of a robot voice elicited a significantly smaller hemodynamic response than presentation of a human voice in a left OFC channel, while processing of human faces elicited significantly higher hemodynamic activity than processing of robot faces in two left DLPFC channels and a left OFC channel. A significant correlation between hemodynamic and behavioral responses for the face-voice mismatch effect was found in the left OFC. Our results highlight the potential of fNIRS for unraveling the neural processing of human and robot audio-visual stimuli, which might enable optimization of social robot designs and contribute to elucidating the neural processing of human and robot stimuli in the PFC under naturalistic conditions.
EEG theta and Mu oscillations during perception of human and robot actions.
The perception of others' actions supports important skills such as communication, intention understanding, and empathy. Are mechanisms of action processing in the human brain specifically tuned to process biological agents? Humanoid robots can perform recognizable actions, but can look and move differently from humans, and as such can be used in experiments to address such questions. Here, we recorded EEG as participants viewed actions performed by three agents. In the Human condition, the agent had biological appearance and motion. The other two conditions featured a state-of-the-art robot in two different appearances: Android, which had biological appearance but mechanical motion, and Robot, which had mechanical appearance and motion. We explored whether sensorimotor mu (8-13 Hz) and frontal theta (4-8 Hz) activity exhibited selectivity for biological entities, in particular whether the visual appearance and/or the motion of the observed agent was biological. Sensorimotor mu suppression has been linked to the motor-simulation aspect of action processing (and the human mirror neuron system, MNS), and frontal theta to semantic and memory-related aspects. For all three agents, action observation induced significant attenuation in the power of mu oscillations, with no difference between agents. Thus, mu suppression, considered an index of MNS activity, does not appear to be selective for biological agents. Observation of the Robot resulted in greater frontal theta activity than the Android and the Human, whereas the latter two did not differ from each other. Frontal theta thus appears to be sensitive to visual appearance, suggesting that agents that are not sufficiently biological in appearance may impose greater memory-processing demands on the observer. Studies combining robotics and neuroscience such as this one allow us to explore the neural basis of action processing on the one hand, and to inform the design of social robots on the other.
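The mu-suppression measure discussed above can be illustrated numerically. The sketch below is a minimal, generic illustration on synthetic signals (not the study's actual analysis pipeline): it compares 8-13 Hz band power between a baseline segment and an action-observation segment using Welch's method. The sampling rate and the signals themselves are assumed for demonstration only.

```python
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate in Hz; assumed value for illustration

def band_power(signal, fs, lo, hi):
    """Mean power spectral density within [lo, hi] Hz (Welch's method)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def mu_suppression(baseline, observation, fs=FS):
    """Log-ratio of mu-band (8-13 Hz) power, observation vs. baseline.
    Negative values indicate suppression during action observation."""
    return np.log(band_power(observation, fs, 8, 13) /
                  band_power(baseline, fs, 8, 13))

# Synthetic demo: baseline carries a strong 10 Hz rhythm;
# the "observation" segment has the same rhythm attenuated.
t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
observed = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(mu_suppression(baseline, observed))  # negative value -> suppression
```

In practice, EEG studies of this kind compute such ratios per electrode and per trial; this sketch only captures the core band-power comparison.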
Observation and imitation of actions performed by humans, androids, and robots: an EMG study
Understanding others’ actions is essential for functioning in the physical and social world. In the past two decades, research has shown that action perception involves the motor system, supporting theories that we understand others’ behavior via embodied motor simulation. Recently, the empirical study of action perception has been facilitated by well-controlled artificial stimuli such as robots. One broad question this approach can address is which aspects of similarity between the observer and the observed agent facilitate motor simulation. Since humans have evolved among other humans and animals, using artificial stimuli such as robots allows us to probe whether our social perceptual systems are specifically tuned to process other biological entities. In this study, we used humanoid robots with different degrees of human-likeness in appearance and motion, along with electromyography (EMG) to measure muscle activity in participants’ arms while they either observed or imitated videos of three agents producing actions with their right arm. The agents were a Human (biological appearance and motion), a Robot (mechanical appearance and motion), and an Android (biological appearance and mechanical motion). Right-arm muscle activity increased when participants imitated all agents. Increased muscle activation was also found in the stationary arm, both during imitation and observation. Furthermore, muscle activity was sensitive to motion dynamics: activity was significantly stronger for imitation of the Human than of either mechanical agent. There was also a relationship between the dynamics of the muscle activity and the motion dynamics in the stimuli. Overall, our data indicate that motor simulation is not limited to observation and imitation of agents with a biological appearance, but is also found for robots. However, we also found sensitivity to human motion in the EMG responses. Combining data from multiple methods allows us to obtain a more complete picture of action understanding and the underlying neural computations.
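A standard way to quantify muscle activity of the kind measured here is the linear EMG envelope: full-wave rectification followed by low-pass filtering. The following is a generic sketch on synthetic data, not the study's actual processing chain; the sampling rate, filter order, and cutoff are assumed values chosen for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # EMG sampling rate in Hz; assumed for illustration

def emg_envelope(emg, fs=FS, cutoff=6.0):
    """Full-wave rectify the EMG and smooth with a 4th-order low-pass
    Butterworth filter to obtain the linear envelope, a common measure
    of muscle-activity amplitude over time."""
    rectified = np.abs(emg - emg.mean())          # remove DC offset, rectify
    b, a = butter(4, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, rectified)              # zero-phase filtering

# Synthetic demo: low-level noise with a burst of activity from 1 s to 2 s
rng = np.random.default_rng(1)
t = np.arange(0, 3, 1 / FS)
emg = 0.05 * rng.standard_normal(t.size)
burst = (t > 1) & (t < 2)
emg[burst] += 0.5 * rng.standard_normal(burst.sum())
env = emg_envelope(emg)
# The envelope during the burst clearly exceeds the resting level.
```

Comparing mean envelope amplitude across conditions (observe vs. imitate, agent type) is the kind of summary statistic an analysis like the one described above would feed into its statistics.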
Biological motion perception in perceptual decision-making framework: ERP evidence in humans
Neurophysiological studies in non-human primates suggest that perceptual decision-making consists of two stages of information processing: sensory evidence accumulation and response selection. Recent work with humans shows that the sensory evidence accumulation process can be tracked with the centro-parietal positivity (CPP), a component derived from EEG. As most studies in the field use simple motion stimuli, it remains unclear whether these processes generalize to more complex and socially important stimuli such as biological motion. In the present study, we used point-light displays with four levels of coherence and recorded EEG as human subjects (N=14) performed a perceptual decision-making task. Our results show that biological motion elicited a CPP component whose peak rate tracks the coherence level of the stimuli, albeit with a later onset than observed previously. These results suggest that similar decision-making mechanisms may play a role in biological motion perception.
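The evidence-accumulation account referenced above can be illustrated with a toy drift-diffusion simulation in which the drift rate is proportional to stimulus coherence: higher coherence yields faster build-up and earlier threshold crossing. This is a generic sketch of the framework, not the model or parameters used in the study; threshold, noise, and time step are assumed values.

```python
import numpy as np

def accumulate(coherence, threshold=1.0, noise=0.1, dt=0.01, seed=0):
    """Simulate a one-dimensional evidence-accumulation process whose
    drift rate is proportional to stimulus coherence. Returns the time
    (in seconds) at which the accumulator first reaches threshold."""
    rng = np.random.default_rng(seed)
    x, t = 0.0, 0.0
    while x < threshold:
        x += coherence * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t

# Mean decision time across simulated trials, per coherence level
times = {c: np.mean([accumulate(c, seed=s) for s in range(50)])
         for c in (0.2, 0.4, 0.8)}
# Higher coherence -> steeper build-up -> shorter decision times
```

The monotonic relation between coherence and build-up rate in this toy model mirrors the coherence-tracking CPP build-up reported in the abstract.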
Visual perception of mechanical motion: A comparison of methods that disrupt biological motion
Previous work shows that visual perception of biological motion is supported by a specialized neural system and is distinct from perception of non-biological or mechanical motion. However, there has been no systematic characterization of what constitutes mechanical motion, especially in the context of complex actions performed by humans and non-human agents such as robots. In the present study, we proposed four different methods that disrupt biological motion and investigated to what extent the altered motion stimuli were perceived as mechanical by human observers (N=230). The methods manipulated the pattern of the motion trajectories, their timing, and the number of limbs used in the performed actions. The results show that the motion stimuli were perceived as most mechanical when the pattern of the motion trajectory was changed. We also found that perceived mechanicalness increased with the number of limbs whose motion trajectory was changed.
The unique role of parietal cortex in action observation: Functional organization for communicative and manipulative actions
Action observation is supported by a network of regions in occipito-temporal, parietal, and premotor cortex in primates. Recent research suggests that the parietal node has regions dedicated to different action classes, including manipulation, interpersonal interactions, skin displacement, locomotion, and climbing. The goals of the current study were: 1) to extend this work with new classes of actions that are communicative and specific to humans, and 2) to investigate how parietal cortex differs from occipito-temporal and premotor cortex in representing action classes. Human subjects underwent fMRI scanning while observing three action classes: indirect communication, direct communication, and manipulation, plus two types of control stimuli: static controls, which were static frames from the video clips, and dynamic controls, consisting of temporally scrambled optic-flow information. Using univariate analysis, multivariate pattern analysis (MVPA), and representational similarity analysis, our study presents several novel findings. First, we provide further evidence for the anatomical segregation of different action classes in parietal cortex: we found a new site specific for representing human-specific indirect communicative actions in cytoarchitectonic parietal area PFt. Second, we found that the discriminability between action classes was higher in parietal cortex than in the other two nodes, suggesting the coding of action-identity information at this level. Finally, our results advocate the use of control stimuli not just for univariate analysis of complex action videos but also when using multivariate techniques.
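Representational similarity analysis, one of the methods named above, compares brain regions by correlating their representational dissimilarity matrices (RDMs). The following is a minimal generic sketch on synthetic data, not the study's pipeline; the condition counts, voxel counts, and noise levels are assumptions chosen purely for illustration.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Representational dissimilarity matrix (condensed form):
    correlation distance between condition patterns (conditions x voxels)."""
    return pdist(patterns, metric="correlation")

def rsa(patterns_a, patterns_b):
    """Spearman correlation between two regions' RDMs, the standard
    second-order comparison used in representational similarity analysis."""
    return spearmanr(rdm(patterns_a), rdm(patterns_b)).correlation

# Synthetic demo: 3 action classes x 5 exemplars, 50 "voxels" per region
rng = np.random.default_rng(2)
class_means = rng.standard_normal((3, 50))
patterns = np.repeat(class_means, 5, axis=0) + 0.3 * rng.standard_normal((15, 50))
region_b = patterns + 0.3 * rng.standard_normal((15, 50))  # shares class structure
print(rsa(patterns, region_b))  # high positive value: shared geometry
```

Regions whose RDMs correlate strongly share representational geometry; comparing such correlations across the occipito-temporal, parietal, and premotor nodes is how analyses like the one described can localize where action-class information is coded.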
Top-down effects of attention on the action observation network
Action perception plays a crucial role in human life and is significant both evolutionarily and socially. It is supported by the well-established action observation network (AON), which consists of three core nodes in the occipitotemporal, parietal, and premotor cortex of the human brain. In the present study, we investigated how top-down attention affects the AON using fMRI and representational similarity analysis. In the first session, human participants viewed short videos while performing tasks in which they attended to different features of the actions in separate blocks, including the actor, the target, or the effector. In a second fMRI session, participants viewed the videos passively, allowing us to compare the neural representations in the AON across passive viewing and active attention tasks. Our results show that attention to different features of observed actions modulates the nodes of the AON differently, indicating their distinct roles in supporting action perception.
Biological Motion Perception under Attentional Load
Biological motion perception is supported by a network of regions in the occipito-temporal cortex, primarily in the superior temporal sulcus (STS), and in premotor cortex (PMC). How biological motion is processed outside the focus of attention, and whether this processing is modulated by attentional load, remains unknown. We investigated the bottom-up processing of biological motion under different levels of attentional load (high vs. low) with functional magnetic resonance imaging (N=13). In line with previous work, we found that fronto-parietal attention regions were significantly more activated when the attentional load was high than when it was low. Importantly, biological motion under low attentional load yielded activity in STS and PMC, whereas under high load activity was restricted to low-level motion-sensitive areas. These results show that biological motion is processed outside the focus of attention and that this processing is modulated by attentional load.